85 research outputs found

    The XMM Cluster Survey: Forecasting cosmological and cluster scaling-relation parameter constraints

    We forecast the constraints on the values of sigma_8, Omega_m, and cluster scaling-relation parameters that we expect to obtain from the XMM Cluster Survey (XCS). We assume a flat Lambda-CDM Universe and perform a Monte Carlo Markov Chain analysis of the evolution of the number density of galaxy clusters that takes into account a detailed simulated selection function. A comparison of our currently observed number of clusters with predictions shows good agreement. We determine the expected degradation of the constraints as a result of self-calibrating the luminosity-temperature relation (with scatter), including temperature measurement errors, and relying on photometric methods for the estimation of galaxy cluster redshifts. We examine the effects of systematic errors in scaling-relation and measurement-error assumptions. Using only (T,z) self-calibration, we expect to measure Omega_m to +-0.03 (and Omega_Lambda to the same accuracy, assuming flatness) and sigma_8 to +-0.05, while also constraining the normalization and slope of the luminosity-temperature relation to +-6 and +-13 per cent (at 1 sigma), respectively. Self-calibration fails to jointly constrain the scatter and redshift evolution of the luminosity-temperature relation significantly; additional archival and/or follow-up data will improve on this. We do not expect measurement errors, or imperfect knowledge of their distribution, to degrade the constraints significantly. Scaling-relation systematics can, however, easily lead to cosmological constraints 2 sigma or more away from the fiducial model. Our treatment is the first exact treatment to this level of detail, and introduces a new 'smoothed ML' estimate of expected constraints. Comment: 28 pages, 17 figures. Revised version, as accepted for publication in MNRAS. High-resolution figures available at http://xcs-home.org (under "Publications").
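
    The forecast turns on comparing observed cluster counts with model predictions in temperature-redshift bins at each step of the chain. Purely as an illustration (this is not the XCS code, and the counts below are placeholder numbers standing in for the mass function, scaling relations, and selection function), a binned Poisson log-likelihood of the kind such an MCMC would evaluate might look like:

        # Minimal sketch: binned Poisson log-likelihood for cluster number counts,
        # dropping the constant log(n!) term. Placeholder numbers, not XCS data.
        import numpy as np

        def log_likelihood(observed, predicted):
            observed = np.asarray(observed, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            return np.sum(observed * np.log(predicted) - predicted)

        observed_counts = [12, 8, 3, 1]          # clusters per (T, z) bin (illustrative)
        predicted_counts = [10.5, 7.9, 3.4, 0.8]
        print(log_likelihood(observed_counts, predicted_counts))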

    The XMM Cluster Survey: X-ray analysis methodology

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5,776 XMM observations used to construct the current XCS source catalogue. A total of 3,675 > 4-sigma cluster candidates with > 50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg^2. Of these, 993 candidates are detected with > 300 background-subtracted X-ray photon counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these candidates, as well as to estimate redshifts from the X-ray data alone. A total of 587 (122) X-ray temperatures to a typical accuracy of < 40 (< 10) per cent have been measured to date. We also present the methodology adopted for determining the selection function of the survey, and show that the extended source detection algorithm is robust to a range of cluster morphologies by inserting mock clusters derived from hydrodynamical simulations into real XMM images. These tests show that a simple isothermal beta-profile is sufficient to capture the essential details of the cluster population detected in the archival XMM observations. The redshift follow-up of the XCS cluster sample is presented in a companion paper, together with a first data release of 503 optically-confirmed clusters. Comment: MNRAS accepted, 45 pages, 38 figures. Our companion paper describing our optical analysis methodology and presenting a first set of confirmed clusters has now been submitted to MNRAS.
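
    The surface-brightness fitting referred to above assumes an isothermal beta-profile, S(r) = S0 * (1 + (r/r_c)^2)^(0.5 - 3*beta). A minimal sketch of that model, with illustrative parameter values rather than XCS measurements, is:

        # Isothermal beta-model surface brightness; parameter values are illustrative only.
        import numpy as np

        def beta_profile(r, s0, r_c, beta):
            """Projected surface brightness at radius r, core radius r_c, slope beta."""
            return s0 * (1.0 + (r / r_c) ** 2) ** (0.5 - 3.0 * beta)

        r = np.linspace(0.0, 500.0, 6)        # projected radius (e.g. kpc), illustrative
        print(beta_profile(r, s0=1.0, r_c=150.0, beta=2.0 / 3.0))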

    SARS-CoV-2 evolution during treatment of chronic infection

    SARS-CoV-2 Spike protein is critical for virus infection via engagement of ACE2, and is a major antibody target. Here we report chronic SARS-CoV-2 infection with reduced sensitivity to neutralising antibodies in an immune-suppressed individual treated with convalescent plasma, generating whole-genome ultradeep sequences over 23 time points spanning 101 days. Little change was observed in the overall viral population structure following two courses of remdesivir over the first 57 days. However, following convalescent plasma therapy we observed large, dynamic virus population shifts, with the emergence of a dominant viral strain bearing D796H in S2 and ΔH69/ΔV70 in the S1 N-terminal domain (NTD) of the Spike protein. As passively transferred serum antibodies diminished, viruses with the escape genotype diminished in frequency, before returning during a final, unsuccessful course of convalescent plasma. In vitro, the Spike escape double mutant bearing ΔH69/ΔV70 and D796H conferred modestly decreased sensitivity to convalescent plasma, whilst maintaining infectivity similar to wild type. D796H appeared to be the main contributor to decreased susceptibility but incurred an infectivity defect. The ΔH69/ΔV70 single mutant had two-fold higher infectivity compared to wild type, possibly compensating for the reduced infectivity of D796H. These data reveal strong selection on SARS-CoV-2 during convalescent plasma therapy associated with the emergence of viral variants showing evidence of reduced susceptibility to neutralising antibodies. COG-UK is supported by funding from the Medical Research Council (MRC), part of UK Research & Innovation (UKRI), the National Institute of Health Research (NIHR), and Genome Research Limited, operating as the Wellcome Sanger Institute.

    A survey and classification of software-defined storage systems

    The exponential growth of digital information is imposing increasing scale and efficiency demands on modern storage infrastructures. As infrastructure complexity increases, so does the difficulty in ensuring quality of service, maintainability, and resource fairness, raising unprecedented performance, scalability, and programmability challenges. Software-Defined Storage (SDS) addresses these challenges by cleanly disentangling control and data flows, easing management, and improving control functionality of conventional storage systems. Despite its momentum in the research community, many aspects of the paradigm are still unclear, undefined, and unexplored, leading to misunderstandings that hamper the research and development of novel SDS technologies. In this article, we present an in-depth study of SDS systems, providing a thorough description and categorization of each plane of functionality. Further, we propose a taxonomy and classification of existing SDS solutions according to different criteria. Finally, we provide key insights about the paradigm and discuss potential future research directions for the field. This work was financed by the Portuguese funding agency FCT - Fundação para a Ciência e a Tecnologia through national funds, the PhD grant SFRH/BD/146059/2019, the project ThreatAdapt (FCT-FNR/0002/2018), the LASIGE Research Unit (UIDB/00408/2020), and co-funded by the FEDER, where applicable.
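
    To make the control/data-plane split concrete, the sketch below is a deliberately simplified, hypothetical interface (the class names are invented and do not correspond to any surveyed SDS system): a control plane installs per-tenant policies, and a data-plane stage enforces them on every request.

        # Hypothetical sketch of SDS control/data-plane separation (illustration only).
        class ControlPlane:
            def __init__(self):
                self.policies = {}            # tenant -> allowed operations per interval

            def set_policy(self, tenant, limit):
                self.policies[tenant] = limit

        class DataPlaneStage:
            def __init__(self, control_plane):
                self.control = control_plane
                self.usage = {}               # tenant -> operations seen this interval

            def admit(self, tenant):
                """Admit the request only while the tenant is within its policy."""
                limit = self.control.policies.get(tenant, float("inf"))
                used = self.usage.get(tenant, 0)
                if used < limit:
                    self.usage[tenant] = used + 1
                    return True
                return False

        control = ControlPlane()
        control.set_policy("tenant-a", 2)
        stage = DataPlaneStage(control)
        print([stage.admit("tenant-a") for _ in range(3)])   # [True, True, False]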

    Description and performance of track and primary-vertex reconstruction with the CMS tracker

    A description is provided of the software algorithms developed for the CMS tracker both for reconstructing charged-particle trajectories in proton-proton interactions and for using the resulting tracks to estimate the positions of the LHC luminous region and individual primary-interaction vertices. Despite the very hostile environment at the LHC, the performance obtained with these algorithms is found to be excellent. For ttbar events under typical 2011 pileup conditions, the average track-reconstruction efficiency for promptly-produced charged particles with transverse momenta of pT > 0.9 GeV is 94% for pseudorapidities of |η| < 0.9 and 85% for 0.9 < |η| < 2.5. The inefficiency is caused mainly by hadrons that undergo nuclear interactions in the tracker material. For isolated muons, the corresponding efficiencies are essentially 100%. For isolated muons of pT = 100 GeV emitted at |η| < 1.4, the resolutions are approximately 2.8% in pT and, respectively, 10 μm and 30 μm in the transverse and longitudinal impact parameters. The position resolution achieved for reconstructed primary vertices that correspond to interesting pp collisions is 10-12 μm in each of the three spatial dimensions. The tracking and vertexing software is fast and flexible, and easily adaptable to other functions, such as fast tracking for the trigger, or dedicated tracking for electrons that takes into account bremsstrahlung.
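
    As a rough illustration of the vertexing task (not the CMS algorithm itself, which clusters tracks adaptively and fits them with full covariance information), a primary-vertex z position can be sketched as the uncertainty-weighted mean of the tracks' longitudinal impact points:

        # Toy vertex-z estimate: uncertainty-weighted mean of track z positions.
        import numpy as np

        def vertex_z(track_z, track_sigma_z):
            z = np.asarray(track_z, dtype=float)
            w = 1.0 / np.asarray(track_sigma_z, dtype=float) ** 2
            return np.sum(w * z) / np.sum(w), np.sqrt(1.0 / np.sum(w))

        # Illustrative tracks with ~30 micron (0.003 cm) longitudinal resolution.
        z_vals = [0.101, 0.097, 0.103, 0.099]   # cm
        sigmas = [0.003, 0.003, 0.004, 0.003]   # cm
        print(vertex_z(z_vals, sigmas))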

    Alignment of the CMS tracker with LHC and cosmic ray data

    © CERN 2014 for the benefit of the CMS collaboration, published under the terms of the Creative Commons Attribution 3.0 License by IOP Publishing Ltd and Sissa Medialab srl. Any further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation and DOI. The central component of the CMS detector is the largest silicon tracker ever built. The precise alignment of this complex device is a formidable challenge, and only achievable with a significant extension of the technologies routinely used for tracking detectors in the past. This article describes the full-scale alignment procedure as it is used during LHC operations. Among the specific features of the method are the simultaneous determination of up to 200 000 alignment parameters with tracks, the measurement of individual sensor curvature parameters, the control of systematic misalignment effects, and the implementation of the whole procedure in a multi-processor environment for high execution speed. Overall, the achieved statistical accuracy on the module alignment is found to be significantly better than 10 μm.
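
    The track-based alignment is, at its core, a very large global least-squares fit. The toy sketch below shows that idea on a handful of parameters (the production procedure determines up to 200 000 of them simultaneously with the track parameters); the matrix and values are invented for illustration.

        # Toy global alignment fit: solve min_a |r - J a|^2 for corrections a,
        # where J holds derivatives of track residuals w.r.t. alignment parameters.
        import numpy as np

        rng = np.random.default_rng(0)
        n_residuals, n_params = 200, 5
        J = rng.normal(size=(n_residuals, n_params))
        true_a = np.array([0.02, -0.01, 0.005, 0.0, 0.015])          # hidden misalignments
        r = J @ true_a + rng.normal(scale=0.01, size=n_residuals)    # noisy residuals

        a_hat, *_ = np.linalg.lstsq(J, r, rcond=None)
        print(a_hat)   # recovered corrections, close to true_a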

    Hitomi (ASTRO-H) X-ray Astronomy Satellite

    The Hitomi (ASTRO-H) mission is the sixth Japanese x-ray astronomy satellite, developed by a large international collaboration including Japan, the USA, Canada, and Europe. The mission aimed to provide the highest energy resolution ever achieved at E > 2 keV, using a microcalorimeter instrument, and to cover a wide energy range spanning four decades in energy, from soft x-rays to gamma rays. After a successful launch on February 17, 2016, the spacecraft lost its function on March 26, 2016, but the roughly one-month commissioning phase provided valuable information on the onboard instruments and the spacecraft system, including astrophysical results obtained from first-light observations. This paper describes the Hitomi (ASTRO-H) mission, its capabilities, the initial operation, and the instrument and spacecraft performance confirmed during the commissioning operations.

    Towards higher disk head utilization: extracting free bandwidth from busy disk drives

    Freeblock scheduling is a new approach to utilizing more of disks' potential media bandwidths. By filling rotational latency periods with useful media transfers, 20-50% of a never-idle disk's bandwidth can often be provided to background applications with no effect on foreground response times. This paper describes freeblock scheduling and demonstrates its value with two concrete applications: free segment cleaning and free data mining. Free segment cleaning often allows an LFS file system to maintain its ideal write performance when cleaning overheads would otherwise cause up to a factor of 3 performance decrease. Free data mining can achieve 45-70 full disk scans per day on an active transaction processing system, with no effect on transaction performance.
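
    The free bandwidth comes from rotational latency that every foreground request already pays: while the disk waits for the target sector to rotate under the head, other sectors on the track can be transferred at no extra cost. A back-of-the-envelope sketch with illustrative (not measured) timing numbers shows how large that idle fraction can be on a busy disk:

        # Illustrative estimate of the service-time fraction spent in rotational latency,
        # the window freeblock scheduling fills with background transfers.
        rotation_ms = 6.0                        # one revolution of a 10,000 RPM disk
        avg_rot_latency_ms = rotation_ms / 2     # wait half a revolution on average
        seek_ms = 5.0                            # average foreground seek
        transfer_ms = 1.0                        # foreground media transfer

        service_time_ms = seek_ms + avg_rot_latency_ms + transfer_ms
        free_fraction = avg_rot_latency_ms / service_time_ms
        print(f"{free_fraction:.0%} of service time is rotational latency")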